An optimal gradient method for smooth strongly convex minimization
Authors
Abstract
We present an optimal gradient method for smooth strongly convex optimization. The method is optimal in the sense that its worst-case bound on the distance to an optimal point exactly matches the lower bound on the oracle complexity for this class of problems, meaning that no black-box first-order method can have a better worst-case guarantee without further assumptions on the problems at hand. In addition, we provide a constructive recipe for obtaining the algorithmic parameters of the method and illustrate that it can be used for deriving methods for other optimality criteria as well.
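To illustrate the problem class and the kind of worst-case guarantee the abstract refers to, here is a minimal sketch of plain gradient descent on a smooth, strongly convex quadratic. This is not the paper's optimal method; the matrix, constants, and step size are illustrative assumptions (the classical step 2/(L+μ) gives the contraction factor (L−μ)/(L+μ) per iteration on this class):

```python
import numpy as np

# Assumed toy problem: f(x) = 0.5 * x^T A x with mu <= eig(A) <= L,
# so f is L-smooth and mu-strongly convex, with minimizer x* = 0.
A = np.diag([1.0, 10.0])      # eigenvalues 1 and 10
mu, L = 1.0, 10.0
x = np.array([1.0, 1.0])
step = 2.0 / (L + mu)         # classical step size for this class

for _ in range(100):
    x = x - step * (A @ x)    # gradient of f is A x

# distance to x* = 0 shrinks by ((L - mu) / (L + mu)) per step
print(np.linalg.norm(x))
```

Methods such as the one in the paper improve on this baseline by a carefully chosen momentum/parameter schedule, down to the information-theoretic lower bound.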
Similar resources
Alternating Proximal Gradient Method for Convex Minimization
In this paper, we propose an alternating proximal gradient method that solves convex minimization problems with three or more separable blocks in the objective function. Our method is based on the framework of alternating direction method of multipliers. The main computational effort in each iteration of the proposed method is to compute the proximal mappings of the involved convex functions. T...
Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization
Stochastic gradient descent (SGD) is a simple and popular method to solve stochastic optimization problems which arise in machine learning. For strongly convex problems, its convergence rate was known to be O(log(T)/T), by running SGD for T iterations and returning the average point. However, recent results showed that using a different algorithm, one can get an optimal O(1/T) rate. This mig...
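The SGD setup described above can be sketched on a toy strongly convex objective. Everything here is an illustrative assumption: the objective f(x) = E_z[0.5·(x − z)²] with z ~ N(x*, 1) is 1-strongly convex with minimizer x*, the step size 1/(μt) is the classical schedule, and the average iterate is the one carrying the O(log(T)/T) guarantee mentioned in the snippet:

```python
import numpy as np

# Assumed toy problem: minimize f(x) = E_z[0.5 * (x - z)^2], z ~ N(x_star, 1).
# f is mu-strongly convex with mu = 1 and minimizer x_star.
rng = np.random.default_rng(0)
x_star, mu, T = 3.0, 1.0, 20000
x, avg = 0.0, 0.0

for t in range(1, T + 1):
    z = x_star + rng.standard_normal()   # stochastic sample
    grad = x - z                         # unbiased stochastic gradient
    x -= grad / (mu * t)                 # decreasing step size 1/(mu * t)
    avg += (x - avg) / t                 # running average of the iterates

print(abs(avg - x_star))                 # small for large T
```

The result cited in the snippet shows that returning a different point (e.g. a suffix average) instead of the full average improves the rate from O(log(T)/T) to the optimal O(1/T).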
A coordinate gradient descent method for ℓ1-regularized convex minimization
In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., the problem of minimizing ℓ1-regularized linear least squares problems. In this paper, we propose a block coordinate gradient descent method (abbreviated a...
A Bundle Method for Solving Convex Non-smooth Minimization Problems
Numerical experience shows that bundle methods are very efficient for solving convex non-smooth optimization problems. In this paper we briefly describe the mathematical background of a bundle method and discuss practical aspects of its numerical implementation. Further, we give a detailed documentation of our implementation and report on numerical tests.
Beyond the regret minimization barrier: an optimal algorithm for stochastic strongly-convex optimization
We give a novel algorithm for stochastic strongly-convex optimization in the gradient oracle model which returns an O(1/T)-approximate solution after T gradient updates. This rate of convergence is optimal in the gradient oracle model. This improves upon the previously known best rate of O(log(T)/T), which was obtained by applying an online strongly-convex optimization algorithm with regre...
Journal
Journal title: Mathematical Programming
Year: 2022
ISSN: 0025-5610, 1436-4646
DOI: https://doi.org/10.1007/s10107-022-01839-y